Results 1 - 2 of 2
1.
2021 IEEE International Conference on Image Processing (ICIP 2021); September 2021:170-174.
Article in English | Scopus | ID: covidwho-1735800

ABSTRACT

With the recent outbreak of COVID-19, ultrasound is fast becoming an indispensable diagnostic tool for regular and continuous monitoring of the lung. However, lung ultrasound (LUS) is unique in that the artefacts created by acoustic wave propagation aid clinicians in diagnosis. In this work, a novel approach is presented to extract features driven by acoustic wave propagation, such as acoustic shadows, local phase-based feature symmetry, and integrated backscattering, to automatically detect the pleura and to aid a pretrained neural network in classifying the severity of lung infection based on the region below the pleura. A detailed analysis of the proposed approach on LUS images spanning the infection-to-full-recovery period of ten confirmed COVID-19 subjects across 400 videos shows an average five-fold cross-validation accuracy, sensitivity, and specificity of 97%, 92%, and 98%, respectively, over 5000 randomly selected frames. The results and analysis show that, when the input dataset is limited and diverse, as in the case of the COVID-19 pandemic, combining acoustic-propagation-based features with the grayscale images, as proposed in this work, significantly improves the performance of the neural network, even when tested against a completely new data acquisition. © 2021 IEEE.
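
As a rough illustration of the feature-fusion idea described in this abstract, the sketch below stacks a grayscale LUS frame with two simplified acoustic-physics feature maps: a cumulative-echo-energy proxy for integrated backscattering and a crude acoustic-shadow map below the brightest per-column reflector. The function name and the exact feature definitions are assumptions made for illustration, not the authors' formulations (the paper's local phase-based feature symmetry is omitted here).

```python
import numpy as np

def acoustic_feature_channels(frame):
    """Stack a grayscale LUS frame with two illustrative acoustic-physics
    feature maps. These are simplified stand-ins, not the paper's exact
    feature definitions.

    frame: 2-D array, rows = depth (top to bottom), columns = lateral position.
    """
    gray = frame.astype(np.float32)
    gray = (gray - gray.min()) / (gray.max() - gray.min() + 1e-8)  # scale to [0, 1]

    # Integrated-backscattering proxy: cumulative echo energy along depth,
    # normalized per column by the total energy in that column.
    energy = np.cumsum(gray ** 2, axis=0)
    ibs = energy / (energy[-1:, :] + 1e-8)

    # Acoustic-shadow proxy: pixels below the brightest reflector in each
    # column (a stand-in for the pleura) that carry little remaining energy.
    pleura_row = np.argmax(gray, axis=0)
    depth_idx = np.arange(gray.shape[0])[:, None]
    shadow = (depth_idx > pleura_row[None, :]) * (1.0 - ibs)

    # Multichannel input of shape (3, depth, lateral) for a CNN classifier.
    return np.stack([gray, ibs, shadow.astype(np.float32)])
```

A frame of shape (depth, width) thus becomes a (3, depth, width) array that can replace the single grayscale channel at the input of a pretrained image classifier.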

2.
arXiv; 2021.
Preprint in English | PREPRINT-ARXIV | ID: ppzbmed-2106.06980v1

ABSTRACT

With the recent outbreak of COVID-19, ultrasound is fast becoming an indispensable diagnostic tool for regular and continuous monitoring of the lung. In this work, a novel approach is presented to extract acoustic-propagation-based features that automatically highlight the region below the pleura, an important landmark in lung ultrasound (LUS). Subsequently, a multichannel input formed from the acoustic-physics-based feature maps is used to train a neural network, referred to as LUSNet, to classify LUS images into five classes of varying severity of lung infection and thereby track the progression of COVID-19. To ensure that the proposed approach is agnostic to the type of acquisition, LUSNet, which consists of a U-net architecture, is trained in an unsupervised manner on the acoustic feature maps so that the encoder-decoder architecture learns features in the pleural region of interest. A novel combination of the U-net output and the U-net encoder output is employed to classify the severity of infection in the lung. A detailed analysis of the proposed approach on LUS images spanning the infection-to-full-recovery period of ten confirmed COVID-19 subjects shows an average five-fold cross-validation accuracy, sensitivity, and specificity of 97%, 93%, and 98%, respectively, over 5000 frames of COVID-19 videos. The analysis also shows that, when the input dataset is limited and diverse, as in the case of the COVID-19 pandemic, combining acoustic-propagation-based features with the grayscale images, as proposed in this work, significantly improves the performance of the neural network and also aids the labelling and triaging process.
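
The two-stage design described in this abstract (an encoder-decoder trained on acoustic feature maps, whose decoder output and encoder output are then combined for severity classification) could look roughly like the PyTorch sketch below. The class name TinyLUSNet, the channel sizes, the 128x128 input, and the pooling-plus-linear classifier head are assumptions made for illustration; this is not the paper's LUSNet.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the usual U-net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyLUSNet(nn.Module):
    """Minimal U-net-style encoder-decoder with a classifier head that fuses
    the decoder output with the encoder (bottleneck) features, loosely
    following the abstract. Sizes and the 5-class head are illustrative."""

    def __init__(self, in_channels=3, num_classes=5):
        super().__init__()
        self.enc1 = conv_block(in_channels, 16)
        self.enc2 = conv_block(16, 32)
        self.bottleneck = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec2 = conv_block(64, 32)
        self.up1 = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec1 = conv_block(32, 16)
        self.seg_head = nn.Conv2d(16, 1, 1)          # pleural-region / reconstruction map
        self.gap = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(64 + 16, num_classes)

    def forward(self, x):
        e1 = self.enc1(x)                            # (N, 16, H, W)
        e2 = self.enc2(self.pool(e1))                # (N, 32, H/2, W/2)
        b = self.bottleneck(self.pool(e2))           # (N, 64, H/4, W/4)
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        seg = self.seg_head(d1)                      # U-net (decoder) output
        # Fuse pooled bottleneck and decoder features for severity scoring.
        feats = torch.cat([self.gap(b).flatten(1), self.gap(d1).flatten(1)], dim=1)
        return seg, self.classifier(feats)

# Example: a batch of two 3-channel acoustic feature maps of size 128x128.
seg_map, severity_logits = TinyLUSNet()(torch.randn(2, 3, 128, 128))
```

In such a setup, the unsupervised stage would supervise only the segmentation/reconstruction output (e.g. against the acoustic feature maps), after which the linear head over the fused features would be trained for the five severity classes.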


Subject(s)
COVID-19